Past Event: Babuška Forum
Nick Alger, Research Associate, Oden Institute, UT Austin
10 – 11AM
Friday Dec 3, 2021
POB 6.304 & Zoom
Surrogate models based on low-rank approximation of the Jacobian are commonly used for dimension reduction in inverse and optimal control problems governed by partial differential equations (PDEs). Using higher-order derivatives in a truncated Taylor series can yield more accurate surrogate models, but this requires constructing compressed representations of higher-order derivative tensors. Previously, this was considered intractable because higher-order derivative tensors are enormous, and they are only accessible indirectly via contractions with vectors ("tensor actions"). This is no longer the case. We recently developed a randomized "tensor-free" tensor train compression method, which can be used to efficiently construct low-rank approximations of second, third, fourth, fifth, and higher-order derivatives of functions that depend implicitly on a high-dimensional parameter through the solution of a PDE. The method can be viewed as a tensor generalization of the randomized singular value decomposition for matrices.
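For readers unfamiliar with the matrix case that the abstract generalizes: the randomized SVD builds a low-rank factorization of a matrix using only products with random vectors, analogous to how the tensor-free method uses only tensor actions. A minimal sketch of the standard randomized SVD (this is background, not the speaker's new algorithm; function and variable names are illustrative):

```python
import numpy as np

def randomized_svd(A, rank, n_oversample=10, seed=None):
    """Rank-`rank` approximate SVD of A via random range sketching."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # Probe the range of A with a Gaussian test matrix ("matrix actions" only).
    Omega = rng.standard_normal((n, rank + n_oversample))
    Y = A @ Omega
    Q, _ = np.linalg.qr(Y)           # orthonormal basis for the sampled range
    B = Q.T @ A                      # small (rank+p) x n projected matrix
    U_hat, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ U_hat
    return U[:, :rank], s[:rank], Vt[:rank, :]

# An exactly rank-5 matrix is recovered to near machine precision
# from matrix-vector products alone.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 300))
U, s, Vt = randomized_svd(A, rank=5, seed=1)
err = np.linalg.norm(A - U @ (s[:, None] * Vt)) / np.linalg.norm(A)
print(err)  # small relative error
```

The tensor train generalization discussed in the talk replaces the single sketching stage with sketches along each tensor mode, again requiring only contractions of the derivative tensor with vectors.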
In this talk I will (1) review the higher-order adjoint approach for computing actions of derivative tensors for high-dimensional implicitly defined functions, (2) provide a brief background on tensor compression and tensor trains, (3) present our new randomized tensor train compression method, and (4) show numerical results in which we use higher-order tensor train Taylor series to form more accurate surrogate models of the parameter-to-observable map in a distributed-parameter PDE-constrained inverse problem.
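The idea behind item (4) can be illustrated on a toy problem. Below, the "parameter-to-observable map" is an explicit nonlinear function rather than a PDE solve (so its derivatives can be written by hand), and the first- and second-order Taylor surrogates are compared; all names are illustrative, and this is a sketch of the general Taylor-surrogate idea, not the speaker's implementation:

```python
import numpy as np

# Toy parameter-to-observable map f(m) = sin(m_0) * exp(m_1).
# In the talk's setting, evaluating f would require a PDE solve.
def f(m):
    return np.sin(m[0]) * np.exp(m[1])

m0 = np.array([0.3, -0.2])  # expansion point

# Gradient and Hessian of f at m0, written out analytically.
grad = np.array([np.cos(m0[0]) * np.exp(m0[1]),
                 np.sin(m0[0]) * np.exp(m0[1])])
H = np.array([[-np.sin(m0[0]) * np.exp(m0[1]), np.cos(m0[0]) * np.exp(m0[1])],
              [ np.cos(m0[0]) * np.exp(m0[1]), np.sin(m0[0]) * np.exp(m0[1])]])

def surrogate1(m):
    """First-order (Jacobian-based) Taylor surrogate."""
    dm = m - m0
    return f(m0) + grad @ dm

def surrogate2(m):
    """Second-order Taylor surrogate: adds the Hessian term."""
    dm = m - m0
    return surrogate1(m) + 0.5 * dm @ H @ dm

m = m0 + 0.1 * np.array([1.0, -1.0])
e1 = abs(f(m) - surrogate1(m))
e2 = abs(f(m) - surrogate2(m))
print(e1, e2)  # the second-order surrogate has the smaller error
```

For a PDE-governed map the Hessian (and higher derivatives) cannot be formed explicitly; the talk's contribution is compressing them into tensor train form using only their actions on vectors, computed via higher-order adjoints.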
Nick Alger received his Ph.D. from the Oden Institute in 2019. He is currently a research associate at the Oden Institute. Nick works on numerical methods for solving inverse problems that arise in the computational geosciences, with a technical focus on Hessian preconditioning and tensor compression.